Venkat Chandrasekaran and Michael I. Jordan
Authors

Abstract
… $\frac{1}{2}\left\|(x^* - \tilde{x}) + \frac{\sigma}{\sqrt{n}}\tilde{z} - \delta\right\|_{\ell_2}^2$ s.t. $\delta \in C - \tilde{x}$. Letting $R_1$ and $R_2$ denote orthogonal subspaces that contain $Q_1$ and $Q_2$, i.e., $Q_1 \subseteq R_1$ and $Q_2 \subseteq R_2$, and letting $\delta^{(1)} = \mathcal{P}_{R_1}(\delta)$, $\delta^{(2)} = \mathcal{P}_{R_2}(\delta)$, $\hat{\delta}_n^{(1)}(C) = \mathcal{P}_{R_1}(\hat{\delta}_n(C))$, $\hat{\delta}_n^{(2)}(C) = \mathcal{P}_{R_2}(\hat{\delta}_n(C))$ denote the projections of $\delta$ and $\hat{\delta}_n(C)$ onto $R_1$ and $R_2$, we can rewrite the above reformulated optimization problem as:
$$\left[\hat{\delta}_n^{(1)}(C),\, \hat{\delta}_n^{(2)}(C)\right] = \arg\min_{\delta^{(1)} \in Q_1,\, \delta^{(2)} \in Q_2} \frac{1}{2}\left\|\mathcal{P}_{R_1}\!\left[(x^* - \tilde{x}) + \frac{\sigma}{\sqrt{n}}\tilde{z}\right] - \delta^{(1)}\right\|_{\ell_2}^2 \;\; \ldots$$
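The key fact behind this rewriting is that a squared $\ell_2$ loss separates across orthogonal subspaces, so the minimization over $\delta$ decouples into independent minimizations over $\delta^{(1)}$ and $\delta^{(2)}$. A minimal numerical sketch of that decomposition (illustrative choices of subspaces in $\mathbb{R}^4$, not the paper's setting; `v` stands in for $(x^* - \tilde{x}) + \frac{\sigma}{\sqrt{n}}\tilde{z}$):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative orthogonal subspaces of R^4 (an assumption for the sketch):
# R1 = span{e1, e2}, R2 = span{e3, e4}, with projectors P1, P2.
P1 = np.diag([1.0, 1.0, 0.0, 0.0])
P2 = np.diag([0.0, 0.0, 1.0, 1.0])

v = rng.standard_normal(4)  # plays the role of (x* - x~) + (sigma/sqrt(n)) z~

# Any delta = delta1 + delta2 with delta1 in R1 and delta2 in R2.
delta1 = P1 @ rng.standard_normal(4)
delta2 = P2 @ rng.standard_normal(4)
delta = delta1 + delta2

# The squared loss splits into the two subspace terms, so minimizing
# over (delta1, delta2) decouples into two independent problems.
lhs = np.sum((v - delta) ** 2)
rhs = np.sum((P1 @ v - delta1) ** 2) + np.sum((P2 @ v - delta2) ** 2)
print(np.isclose(lhs, rhs))  # → True
```

Because the cross terms between the two orthogonal components vanish, this identity holds for every feasible choice of $\delta^{(1)}$ and $\delta^{(2)}$, which is exactly what licenses the componentwise form of the estimator above.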
Similar articles
Computational and Statistical Tradeoffs via Convex Relaxation
Modern massive datasets create a fundamental problem at the intersection of the computational and statistical sciences: how to provide guarantees on the quality of statistical inference given bounds on computational resources, such as time or space. Our approach to this problem is to define a notion of "algorithmic weakening," in which a hierarchy of algorithms is ordered by both computational ...
Rejoinder: Latent Variable Graphical Model Selection via Convex Optimization by Venkat Chandrasekaran,
Convex Graph Invariants, Venkat Chandrasekaran
The structural properties of graphs are usually characterized in terms of invariants, which are functions of graphs that do not depend on the labeling of the nodes. In this paper we study convex graph invariants, which are graph invariants that are convex functions of the adjacency matrix of a graph. Some examples include functions of a graph such as the maximum degree, the MAXCUT value (and it...
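The maximum degree mentioned above is perhaps the simplest convex graph invariant: it is the largest row sum of the adjacency matrix, i.e., a pointwise maximum of linear functions of $A$, and therefore convex. A small numerical sketch of that convexity (the random-graph construction and the check along one convex combination are illustrative, not a proof):

```python
import numpy as np

# Maximum degree as a function of the adjacency matrix: the largest row
# sum, a pointwise maximum of linear functions of A, hence convex in A.
def max_degree(A):
    return np.max(A.sum(axis=1))

rng = np.random.default_rng(1)

def random_adjacency(n):
    # Symmetric 0/1 matrix with zero diagonal (a simple undirected graph).
    U = np.triu(rng.integers(0, 2, size=(n, n)), k=1)
    return U + U.T

A, B = random_adjacency(6), random_adjacency(6)

# Convexity check along a single convex combination:
# f(theta*A + (1-theta)*B) <= theta*f(A) + (1-theta)*f(B).
theta = 0.3
lhs = max_degree(theta * A + (1 - theta) * B)
rhs = theta * max_degree(A) + (1 - theta) * max_degree(B)
print(lhs <= rhs + 1e-12)  # → True
```

Note that `max_degree` is also invariant under relabeling of the nodes (simultaneous row and column permutations of $A$), which is the other defining property of a convex graph invariant.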
Approximation of Jordan homomorphisms in Jordan Banach algebras (RETRACTED PAPER)
In this paper, we investigate the generalized Hyers-Ulam stability of Jordan homomorphisms in Jordan Banach algebras for the functional equation
\begin{align*}
\sum_{k=2}^{n} \sum_{i_1=2}^{k} \sum_{i_2=i_1+1}^{k+1} \cdots \sum_{i_{n-k+1}=i_{n-k}+1}^{n} f\left(\sum_{\substack{i=1,\\ i \neq i_1, \ldots, i_{n-k+1}}}^{n} x_i - \sum_{r=1}^{n-k+1} x_{i_r}\right) + f\left(\sum_{i=1}^{n} x_i\right) - 2^{n-1} f(x_1) = 0,
\end{align*}
where ...
Statistical Inference in Sparse High-dimensional Models: Theoretical and computational challenges
In density estimation, we show that our procedure allows us to recover (at least when the number of observations is large enough) the celebrated maximum likelihood estimator when the model is regular enough and contains the true density. When the latter condition is not satisfied, we show that our procedure is robust (with respect to the Hellinger distance) while the maximum likelihood estimator ...